14 research outputs found

    On Collaboration in Distributed Parameter Estimation with Resource Constraints

    Full text link
    We study sensor/agent data collection and collaboration policies for parameter estimation, accounting for resource constraints and correlation between the observations collected by distinct sensors/agents. Specifically, we consider a group of sensors/agents, each of which samples from different variables of a multivariate Gaussian distribution and has different estimation objectives, and we formulate each sensor/agent's data collection and collaboration policy design problem as a Fisher information maximization (or Cramér-Rao bound minimization) problem. When knowledge of the correlation between variables is available, we analytically identify two particular scenarios: (1) one where the knowledge of the correlation between samples cannot be leveraged for collaborative estimation purposes, and (2) one where the optimal data collection policy involves investing scarce resources to collaboratively sample and transfer information that is not of immediate interest and whose statistics are already known, with the sole goal of increasing the confidence in the estimate of the parameter of interest. When knowledge of certain correlations is unavailable but collaboration may still be worthwhile, we propose novel ways to apply multi-armed bandit algorithms to learn the optimal data collection and collaboration policy in our distributed parameter estimation problem, and we demonstrate through simulations that the proposed algorithms, DOUBLE-F, DOUBLE-Z, UCB-F, and UCB-Z, are effective.
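
    As a rough illustration of the bandit component only: a minimal, generic UCB1 sketch in Python, where each "arm" stands for a hypothetical data collection/collaboration action and the Gaussian reward stands in for per-round information gain. This is an assumption-laden toy, not the paper's DOUBLE-F, DOUBLE-Z, UCB-F, or UCB-Z algorithms.

        import math
        import random

        def ucb1(mean_gains, rounds, seed=0):
            # mean_gains[i]: hypothetical average information gain of action i
            rng = random.Random(seed)
            n = len(mean_gains)
            counts = [0] * n          # times each action was chosen
            totals = [0.0] * n        # cumulative reward per action
            for t in range(1, rounds + 1):
                if t <= n:            # play every action once to initialize
                    arm = t - 1
                else:                 # UCB1 index: empirical mean + bonus
                    arm = max(range(n), key=lambda i: totals[i] / counts[i]
                              + math.sqrt(2.0 * math.log(t) / counts[i]))
                reward = rng.gauss(mean_gains[arm], 0.1)  # stand-in gain
                counts[arm] += 1
                totals[arm] += reward
            return counts

        # Three hypothetical actions; the best one (mean 0.5) dominates play.
        print(ucb1([0.2, 0.5, 0.4], rounds=2000))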

    Estimating Self-Sustainability in Peer-to-Peer Swarming Systems

    Full text link
    Peer-to-peer swarming is one of the de facto solutions for distributed content dissemination in today's Internet. By leveraging resources provided by clients, swarming systems reduce the load on and costs to publishers. However, there is a limit to how much cost savings can be gained from swarming; for example, for unpopular content, peers will always depend on the publisher in order to complete their downloads. In this paper, we investigate this dependence. For this purpose, we propose a new metric, namely swarm self-sustainability. A swarm is referred to as self-sustaining if all its blocks are collectively held by peers; the self-sustainability of a swarm is the fraction of time in which the swarm is self-sustaining. We pose the following question: how does the self-sustainability of a swarm vary as a function of content popularity, the service capacity of the users, and the size of the file? We present a model to answer the posed question. We then propose efficient solution methods to compute self-sustainability. The accuracy of our estimates is validated against simulation. Finally, we also provide closed-form expressions for the fraction of time that a given number of blocks is collectively held by peers.
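
    Given the definitions above, self-sustainability can at least be estimated by brute force. The sketch below is a toy Monte Carlo estimate under simplified, assumed arrival/departure dynamics; the paper instead develops efficient solution methods and closed-form expressions.

        import random

        def self_sustainability(num_blocks, arrival_prob, blocks_per_peer,
                                depart_prob=0.1, steps=10000, seed=0):
            # Fraction of steps in which peers collectively hold all blocks.
            # Dynamics (Bernoulli arrivals, uniform block subsets, geometric
            # departures) are illustrative assumptions, not the paper's model.
            rng = random.Random(seed)
            full = set(range(num_blocks))
            peers = []                  # each peer = set of blocks held
            sustaining = 0
            for _ in range(steps):
                if rng.random() < arrival_prob:
                    peers.append(set(rng.sample(range(num_blocks),
                                                blocks_per_peer)))
                peers = [p for p in peers if rng.random() > depart_prob]
                covered = set().union(*peers) if peers else set()
                if covered == full:
                    sustaining += 1
            return sustaining / steps

        print(self_sustainability(num_blocks=20, arrival_prob=0.5,
                                  blocks_per_peer=8))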

    Online estimating the k central nodes of a network

    No full text
    Estimating the most influential nodes in a network is a fundamental problem in network analysis. Influential nodes may be important spreaders of diseases in biological networks, key actors in terrorist networks, or marketing targets in social networks.
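
    The visible abstract stops at the motivation, so as a generic baseline only: a sketch that tracks the k highest-degree nodes while consuming a stream of edges. Degree is just one of many centrality notions, and this need not match the paper's method.

        import heapq
        from collections import defaultdict

        def top_k_by_degree(edge_stream, k):
            # Update degree counts edge by edge, then report the k
            # highest-degree nodes. Purely illustrative baseline.
            degree = defaultdict(int)
            for u, v in edge_stream:
                degree[u] += 1
                degree[v] += 1
            return heapq.nlargest(k, degree, key=degree.get)

        edges = [(1, 2), (1, 3), (1, 4), (2, 3), (5, 1)]
        print(top_k_by_degree(edges, k=2))  # node 1 first, then 2 or 3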

    Content Availability and Bundling in Swarming Systems

    No full text
    BitTorrent, the immensely popular file swarming system, suffers from a fundamental problem: content unavailability. Although swarming scales well to tolerate flash crowds for popular content, it is less useful for unpopular content, as peers arriving after the initial rush find the content unavailable. Our primary contribution is a model to quantify content availability in swarming systems. We use the model to analyze the availability and the performance implications of bundling, a strategy commonly adopted by many BitTorrent publishers today. We find that even a limited amount of bundling exponentially reduces content unavailability. Surprisingly, for swarms with highly unavailable publishers, the availability gain of bundling can result in a net improvement in download time, i.e., peers obtain more content in less time. We empirically confirm the model's conclusions through experiments on PlanetLab using the mainline BitTorrent client.
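
    The claimed exponential effect is easy to see in a toy independence model (an illustrative assumption, not the paper's availability model): if each of K bundled files independently has an active peer with probability p, a standalone swarm is unavailable with probability 1 - p, while the bundle is unavailable only when none of the K constituent swarms has a peer.

        # Toy calculation: unavailability (1 - p) ** K falls exponentially in K.
        p = 0.3              # assumed per-file peer-presence probability
        for K in (1, 2, 4, 8):
            print(K, (1 - p) ** K)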